Probabilistic Graphical Models (PGMs) are concise representations of probability distributions using graphs. The field also studies efficient algorithms for sampling from distributions represented in this form, whether from the joint distribution, the marginals, or conditional distributions. Other algorithmic questions include computing the Maximum Likelihood Estimate (MLE) and the Maximum A Posteriori (MAP) estimate. The topic has deep connections and applications to many fields, including Theoretical Computer Science, Machine Learning, Statistical Physics, and Bioinformatics. We will also cover the analysis of Markov Chain Monte Carlo (MCMC) algorithms.
Broadly, the course will cover four modules.
Type of Eval | Weightage (%) |
---|---|
Quiz 1 | 10 |
Mid Sem | 15 |
Quiz 2 | 10 |
End Sem | 25 |
Assignments | 20 |
Project | 20 |
Read: For recalling the basics of probability and graph theory, please go through [DB] Chapters 1 and 2.
Solve:
- For any two distributions $p, q$ on $\{1, \cdots, n\}$, show that: $$\max_{A \subseteq \{1, \cdots, n\}} | p(A) - q(A) | = \frac{\sum_{i=1}^n | p(i) - q(i) |}{2}.$$ Note that the event $A$ chosen on the LHS is the best possible way of distinguishing $p$ from $q$ using a single sample. The RHS is half the $\ell_1$ distance $\|p - q\|_1$.
- Prove that any DAG (Directed Acyclic Graph) with a finite number of vertices has at least one vertex with no incoming edges (i.e., no edges pointed towards it). Also show that there is at least one vertex with no outgoing edges.
[Hint] Note that there are infinite graphs for which the statement is false. Hence your proof must use the fact that the graph has only a finite number of nodes.
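Before attempting the proof of the first exercise, it can help to check the identity numerically. The sketch below (an illustration, not part of the assigned solution) brute-forces the maximum of $|p(A) - q(A)|$ over all $2^n$ events $A$ for a small random pair of distributions and compares it with half the $\ell_1$ distance; the function names are our own.

```python
import itertools
import random

def tv_by_subsets(p, q):
    """Brute-force max over all events A of |p(A) - q(A)|."""
    n = len(p)
    best = 0.0
    for r in range(n + 1):
        for A in itertools.combinations(range(n), r):
            diff = abs(sum(p[i] for i in A) - sum(q[i] for i in A))
            best = max(best, diff)
    return best

def half_l1(p, q):
    """Half the l1 distance between p and q."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Two random distributions on {1, ..., 5}.
random.seed(0)
w1 = [random.random() for _ in range(5)]
w2 = [random.random() for _ in range(5)]
p = [w / sum(w1) for w in w1]
q = [w / sum(w2) for w in w2]

# The two quantities agree up to floating-point rounding.
assert abs(tv_by_subsets(p, q) - half_l1(p, q)) < 1e-12
```

The maximizing event $A$ is exactly the set $\{i : p(i) > q(i)\}$, which is the key observation for the proof.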
Free parameters in distributions | Conditional Independence reduces parameters | Graph Representation | d-Connectivity and Independence
d-Connectivity | I-maps | Minimal and Perfect I-maps
Read:
- [DB] Chapter 4
- [KE] https://ermongroup.github.io/cs228-notes/representation/undirected/
Tutorial II on PAC Learning
[DB] Bayesian Reasoning and Machine Learning. David Barber.
[KE] Probabilistic Graphical Models, Course Notes. Volodymyr Kuleshov and Stefano Ermon.
[MMSM] Handbook of Graphical Models. Marloes Maathuis, Mathias Drton, Steffen Lauritzen, and Martin Wainwright.
[KF] Probabilistic Graphical Models: Principles and Techniques. Daphne Koller and Nir Friedman. MIT Press, 2009.
[EX] Probabilistic Graphical Models. Eric Xing.
[KM] Machine Learning: A Probabilistic Perspective. Kevin Patrick Murphy.
[WJ] Graphical Models, Exponential Families, and Variational Inference. Martin J. Wainwright and Michael I. Jordan.
[MM] Information, Physics, and Computation. Marc Mézard and Andrea Montanari.